Gesture Recognition
Bridging the Communication Gap: Artificial Agents Learning Sign Language through Imitation
Tavella, Federico, Galata, Aphrodite, Cangelosi, Angelo
Artificial agents, particularly humanoid robots, interact with their environment, objects, and people using cameras, actuators, and physical presence. Their communication methods are often pre-programmed, limiting their actions and interactions. Our research explores acquiring non-verbal communication skills through learning from demonstrations, with potential applications in sign language comprehension and expression. In particular, we focus on imitation learning for artificial agents, exemplified by teaching a simulated humanoid American Sign Language. We use computer vision and deep learning to extract information from videos, and reinforcement learning to enable the agent to replicate observed actions. Compared to other methods, our approach eliminates the need for additional hardware to acquire information. We demonstrate how the combination of these different techniques offers a viable way to learn sign language. Our methodology successfully teaches 5 different signs involving the upper body (i.e., arms and hands). This research paves the way for advanced communication skills in artificial agents.
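The pipeline the abstract describes (pose information extracted from video, then a reinforcement-learning agent rewarded for replicating the observed motion) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the keypoint array standing in for a vision model's output and the `imitation_reward` function are hypothetical placeholders.

```python
import numpy as np

def imitation_reward(agent_pose: np.ndarray, demo_pose: np.ndarray,
                     scale: float = 1.0) -> float:
    """Dense imitation reward in (0, 1]: exp(-scale * mean squared
    joint error) between the agent's pose and the demonstrated pose.
    Both arrays hold (num_keypoints, 3) upper-body joint positions."""
    error = np.mean((agent_pose - demo_pose) ** 2)
    return float(np.exp(-scale * error))

# Placeholder demonstration: 10 upper-body keypoints with (x, y, z)
# coordinates, where a real system would use keypoints extracted from
# sign-language videos by a deep-learning pose estimator.
demo = np.zeros((10, 3))

# Perfect imitation yields the maximum reward of 1.0 ...
print(imitation_reward(demo, demo))   # 1.0

# ... while a pose offset from the demonstration earns strictly less.
print(imitation_reward(demo + 0.5, demo) < 1.0)   # True
```

At each timestep of an episode, a reward of this shape would be fed to a standard RL algorithm, so that maximizing return drives the simulated humanoid's joints toward the demonstrated sign trajectory.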
- Asia > Japan > Honshū > Chūbu > Ishikawa Prefecture > Kanazawa (0.04)
- North America > United States > New York > New York County > New York City (0.04)
- Europe > United Kingdom (0.04)
- Information Technology > Artificial Intelligence > Robots (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Undirected Networks > Markov Models (0.46)
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition > Gesture Recognition (0.40)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.34)
A real-time Artificial Intelligence system for learning Sign Language
A primary challenge for the deaf and hearing-impaired community stems from the communication gap with the hearing society, which can greatly impact their daily lives and result in social exclusion. To foster inclusivity in society, our endeavor focuses on developing a cost-effective, resource-efficient, and open technology based on Artificial Intelligence, designed to assist people in learning and using Sign Language for communication. The analysis presented in this research paper intends to enrich the recent academic scientific literature on Sign Language solutions based on Artificial Intelligence, with a particular focus on American Sign Language (ASL). This research has yielded promising preliminary results and serves as a basis for further development.
- North America > United States (0.14)
- Europe > Switzerland > Zürich > Zürich (0.14)
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.04)
- Health & Medicine (1.00)
- Education > Curriculum > Subject-Specific Education (1.00)
- Information Technology > Artificial Intelligence > Vision (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.48)
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition > Gesture Recognition (0.40)
Artificial Intelligence for learning Sign Language
This story began in Madrid, Spain. Winter was coming, and a team of four young enthusiasts started a project. The initial idea was to create an app for learning Sign Language, not only because it is an interesting aspect of our society, but also for the 34 million children with disabling hearing loss who need it to communicate. The beauty of technology is that it can be used to help others, and we aimed to do exactly that.
- Information Technology > Artificial Intelligence > Machine Learning > Pattern Recognition > Gesture Recognition (0.40)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.31)
Learning sign language could give you super vision
Researchers at the University of Sheffield have found that learning sign language can benefit hearing adults too, giving them faster reaction times in their peripheral vision. Improved peripheral vision is useful in many sports and for driving, making you more alert to changes in your peripheral field of vision. The study also found that deaf adults have far better peripheral vision and reaction times than both hearing adults and hearing adults who use sign language. The research, conducted at the University of Sheffield's Academic Unit of Ophthalmology, found that learning a visual-spatial language such as British Sign Language (BSL) had a positive impact on adults' visual field response.